
    Divergent mathematical treatments in utility theory

    In this paper I study how divergent mathematical treatments affect mathematical modelling, with a special focus on utility theory. In particular, I examine recent work on the ranking of information states and the discounting of future utilities in order to show how, by replacing the standard analytical treatment of the models involved with one based on the framework of Nonstandard Analysis, diametrically opposite results are obtained. In both cases, the choice between the standard and the nonstandard treatment amounts to a selection of set-theoretical parameters that cannot be made on purely empirical grounds. The analysis of this phenomenon gives rise to a simple logical account of the relativity of impossibility theorems in economic theory, which concludes the paper.
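    To make the discounting case concrete, here is a minimal LaTeX sketch of how the two treatments can diverge; the hyperreal discount factor below is our illustrative assumption, not a construction taken from the paper.

        % Illustrative sketch only (assumed setup, not the paper's model).
        % Standard treatment: a real discount factor \delta strictly below 1,
        %   U(u_0, u_1, \dots) = \sum_{t=0}^{\infty} \delta^{t} u_t , \qquad 0 < \delta < 1 ,
        % forces \delta^{t} \to 0, so distant utilities receive vanishing weight.
        % Nonstandard treatment: take \delta = 1 - \varepsilon with \varepsilon > 0
        % infinitesimal; then \operatorname{st}(\delta^{t}) = 1 for every standard t,
        % so every standard period carries appreciable, near-equal weight.

    Which verdict one obtains thus depends on the chosen number system rather than on any observable data, which is the relativity of impossibility results the paper formalizes.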

    Leibniz's Infinitesimals: Their Fictionality, Their Modern Implementations, and Their Foes from Berkeley to Russell and Beyond

    Many historians of the calculus deny significant continuity between the infinitesimal calculus of the 17th century and 20th-century developments such as Robinson's theory. Robinson's hyperreals, while providing a consistent theory of infinitesimals, require the resources of modern logic; thus many commentators are comfortable denying a historical continuity. A notable exception is Robinson himself, whose identification with the Leibnizian tradition inspired Lakatos, Laugwitz, and others to consider the history of the infinitesimal in a more favorable light. Despite his Leibnizian sympathies, Robinson regards Berkeley's criticisms of the infinitesimal calculus as aptly demonstrating the inconsistency of reasoning with historical infinitesimal magnitudes. We argue that Robinson, among others, overestimates the force of Berkeley's criticisms by underestimating the mathematical and philosophical resources available to Leibniz. Leibniz's infinitesimals are fictions: not logical fictions, as Ishiguro proposed, but rather pure fictions, like imaginaries, which are not eliminable by any syncategorematic paraphrase. We argue that Leibniz's defense of infinitesimals is more firmly grounded than Berkeley's criticism thereof. We show, moreover, that Leibniz's system for the differential calculus was free of logical fallacies. Our argument strengthens the conception of modern infinitesimals as a development of Leibniz's strategy of relating inassignable to assignable quantities by means of his transcendental law of homogeneity.
    Comment: 69 pages, 3 figures
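    A textbook-style gloss on the transcendental law of homogeneity (TLH) mentioned above; the worked example and the modern notation are our illustration, not the authors' own text.

        % Leibniz-style computation of d(x^2)/dx, discarding negligible terms by TLH
        \frac{d(x^2)}{dx} \;=\; \frac{(x+dx)^2 - x^2}{dx} \;=\; \frac{2x\,dx + (dx)^2}{dx} \;=\; 2x + dx
        % TLH: the inassignable dx is negligible relative to the assignable 2x,
        % so 2x + dx is equated with 2x. The modern counterpart is the standard
        % part (shadow) in Robinson's hyperreals: \operatorname{st}(2x + dx) = 2x.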

    Individual biases, cultural evolution, and the statistical nature of language universals: the case of colour naming systems

    Language universals have long been attributed to an innate Universal Grammar. An alternative explanation holds that linguistic universals emerged independently in every language in response to shared cognitive or perceptual biases. A computational model has recently shown how this could be the case, focusing on the paradigmatic example of the universal properties of colour naming patterns and producing results in quantitative agreement with the experimental data. Here we investigate the role of an individual perceptual bias within the framework of the model. We study how, and to what extent, the structure of the bias influences the corresponding linguistic universal patterns. We show that the cultural history of a group of speakers introduces population-specific constraints that act against the pressure for uniformity arising from the individual bias, and we clarify the interplay between these two forces.
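    For orientation only, here is a minimal naming-game-style sketch in Python of how a perceptual bias can be wired into a population of negotiating agents. It is far simpler than the category-game model the abstract refers to, and every name and parameter (N_AGENTS, N_BINS, the bias profile) is a hypothetical placeholder.

        import random

        N_AGENTS = 50      # population size (placeholder)
        N_BINS = 20        # discretized hue space (placeholder)
        STEPS = 50_000

        # Hypothetical perceptual bias: some hues are sampled more often,
        # standing in for regions of finer colour discrimination.
        bias = [1.5 if i % 4 == 0 else 1.0 for i in range(N_BINS)]

        # Each agent keeps a set of candidate names per hue bin.
        agents = [[set() for _ in range(N_BINS)] for _ in range(N_AGENTS)]
        next_name = 0

        for _ in range(STEPS):
            speaker, hearer = random.sample(range(N_AGENTS), 2)
            hue = random.choices(range(N_BINS), weights=bias)[0]
            if not agents[speaker][hue]:          # speaker invents a name
                agents[speaker][hue].add(next_name)
                next_name += 1
            name = random.choice(sorted(agents[speaker][hue]))
            if name in agents[hearer][hue]:       # success: both collapse to it
                agents[speaker][hue] = {name}
                agents[hearer][hue] = {name}
            else:                                 # failure: hearer records it
                agents[hearer][hue].add(name)

    Frequently sampled bins converge to a shared name sooner, giving a crude picture of how an individual-level bias can leave a population-level imprint.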

    How to arrange a Singles Party

    The study addresses an important question regarding the computational aspects of coalition formation. As is well known, finding payoffs (imputations) belonging to the core is a prohibitively difficult, NP-hard task even for modern supercomputers. The task is further complicated by uncertainty over whether the core is non-empty at all. Following Shapley (1971), our Singles Party Game is convex, so a non-empty core is guaranteed. The article introduces a concept of coalitions called nebulouses, analogous to the critical coalitions of Mullat (1979). Nebulouses are coalitions that are minimal by inclusion among all coalitions assembled into a semi-lattice of sets, or kernels, of a "Monotone System" (Mullat 1971, 1976, 1995; Kuznetsov et al. 1982). A property equivalent to convexity, namely the monotonicity of the singles game, allows the core to be found by an effective polynomial algorithm, so this instance of the generally hard problem is tractable. Results are illustrated by an MS Excel spreadsheet.
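    As a hedged illustration of the greedy "peeling" idea behind kernels of monotone systems (an assumption on our part; the paper's exact procedure may differ), the Python sketch below repeatedly removes the currently least credible element and returns the intermediate coalition with the largest minimum credibility. The function credibility is a hypothetical monotone set function supplied by the caller.

        # Hedged sketch: greedy peeling toward a monotone-system kernel.
        # credibility(e, H) must be monotone: shrinking H never raises
        # the credibility of the remaining elements.
        def kernel(elements, credibility):
            H = set(elements)
            best_level, best_set = float("-inf"), set(H)
            while H:
                weakest = min(H, key=lambda e: credibility(e, H))
                level = credibility(weakest, H)   # = min credibility over H
                if level > best_level:
                    best_level, best_set = level, set(H)
                H.remove(weakest)
            return best_set

        # Toy usage: credibility = number of friendships inside the coalition.
        edges = {("a", "b"), ("b", "c"), ("c", "a"), ("c", "d")}
        def degree(e, H):
            return sum(1 for u, v in edges if e in (u, v) and u in H and v in H)
        print(kernel("abcd", degree))   # {'a', 'b', 'c'}: the tightest sub-group

    The loop makes at most a quadratic number of credibility evaluations, which is the kind of polynomial behaviour the monotonicity property buys.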

    An overview of lexicographic choice under uncertainty

    This overview focuses on lexicographic choice under conditions of uncertainty. First, lexicographic versions of traditional (von Neumann-Morgenstern) expected utility theory are described, in which the usual Archimedean axiom is weakened. The role of these lexicographic variants in explaining some well-known “paradoxes” of choice theory is reviewed. Next, the significance of lexicographic choice for game theory is discussed. Finally, some lexicographic extensions of the classical maximin decision rule are described.
    Peer reviewed. Full text: http://deepblue.lib.umich.edu/bitstream/2027.42/44147/1/10479_2005_Article_BF02283523.pd
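    A standard textbook example (ours, not taken from the overview) of why lexicographic choice forces the Archimedean axiom to be weakened:

        % Lexicographic preference on R^2, viewed as a mixture space:
        (x_1, x_2) \succ (y_1, y_2) \iff x_1 > y_1 \ \text{or} \ (x_1 = y_1 \ \text{and} \ x_2 > y_2)
        % Archimedean axiom: x \succ y \succ z implies there exist \alpha, \beta \in (0,1) with
        %   \alpha x + (1-\alpha) z \succ y \succ \beta x + (1-\beta) z .
        % Take x = (1,0), y = (0,1), z = (0,0); then x \succ y \succ z, but
        %   \beta x + (1-\beta) z = (\beta, 0) \succ (0,1) = y  for every \beta > 0,
        % so the second half of the axiom fails, and no real-valued expected-utility
        % representation of \succ exists.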